Executive Summary
iWorQ, a leading software provider for local government agencies, tasked my team with automating and redesigning an existing dashboard that tracks usage and account activity across their various applications. The goal was to build on the prior work and introduce new functionality to help iWorQ make data-driven decisions about user engagement, cancellations, and active usage trends.
Project Scope
The key objectives of the project were:
- Clickable application types: Provide the ability to view usage data by application (e.g., Permit, Code Enforcement), with details on accounts using each application frequently or infrequently.
- Spikes in usage: Identify changes in usage behavior, such as sudden spikes or drop-offs.
- Most active accounts: Highlight accounts that are highly engaged across multiple applications.
- User Information: Enable users to access contact details and usage information for each account and application.
- Automated Updates: Implement real-time or periodic (weekly/monthly) updates to ensure the data reflects the latest user activity and cancellations.
- Cancellation and inactivity tracking: Accurately account for cancellations, ensuring inactive applications are not mistaken for under-utilization.
Challenges with the Project
Planning this project was quite an experience because we were meeting with the company every other week. At the beginning of the process we were looking to track time rather than clicks, but after several meetings we were able to focus on a new unit of analysis. Beyond the unit of analysis, one of the biggest challenges was that midway through the project, our main contact at the company was let go. This left us for about three weeks without anyone to discuss further customizations with, but during that time we were able to implement the prediction features for low-usage accounts.
Technical Process
We used Python, specifically the awswrangler and boto3 libraries, to pull data from an AWS S3 bucket. The S3 bucket contained partitioned Parquet files with historical usage data, spanning over two years. The code I developed allowed us to extract this data efficiently by filtering the relevant data using year and week partitions, ensuring only necessary data was retrieved and processed.
The filtering logic was the hardest part of this, as it had to ensure that only data from the appropriate time frame (two years back) was pulled, without loading unnecessary records.
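The partition filtering described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the partition names (`year`, `week`) and the S3 path are assumptions based on the report's description, though the `partition_filter` callable itself is a real parameter of awswrangler's `s3.read_parquet`.

```python
from datetime import date


def make_partition_filter(today: date, years_back: int = 2):
    """Build a predicate suitable for awswrangler's partition_filter argument.

    awswrangler calls the predicate once per partition with a dict of
    string values, e.g. {"year": "2023", "week": "17"}; partitions for
    which it returns False are skipped entirely, so their Parquet files
    are never downloaded. The `year`/`week` partition scheme is an
    assumption taken from the report.
    """
    iso_year, iso_week, _ = today.isocalendar()
    cutoff = (iso_year - years_back, iso_week)

    def keep(partition: dict) -> bool:
        # Compare (year, week) tuples so week numbers only matter
        # within the boundary year.
        return (int(partition["year"]), int(partition["week"])) >= cutoff

    return keep


# Typical use with awswrangler (bucket path is hypothetical):
# import awswrangler as wr
# df = wr.s3.read_parquet(
#     path="s3://iworq-usage/",  # hypothetical bucket name
#     dataset=True,
#     partition_filter=make_partition_filter(date.today()),
# )
```

Filtering at the partition level, rather than after loading, is what keeps the pull efficient: awswrangler never lists or reads objects under pruned partitions.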
The Dashboard
The final dashboard provides:
- Overview of Usage Data: For each application, users can view which accounts are using it the most or the least.
- Active Accounts and Users: A list of the most active accounts, including detailed information about which applications are being used and the key contacts associated with each.
- Trend Identification: The dashboard automatically flags usage trends, showing increases or decreases in account activity.
- Automated Data Refresh: The dashboard is updated on a scheduled basis (weekly or monthly), pulling the latest data from the S3 bucket to reflect real-time usage.
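The weekly/monthly refresh cadence above can be expressed as a small due-date check. This is a hypothetical sketch: the report does not specify the scheduling mechanism (cron, a cloud scheduler, or the BI tool's own refresh), only the cadence, so the function names here are illustrative.

```python
from datetime import date, timedelta


def refresh_due(last_refresh: date, today: date, cadence: str = "weekly") -> bool:
    """Return True when the dashboard data should be re-pulled from S3.

    The weekly/monthly cadence comes from the project requirements;
    how this check is invoked is left to the scheduler.
    """
    if cadence == "weekly":
        # At least a full week since the last pull.
        return today - last_refresh >= timedelta(days=7)
    if cadence == "monthly":
        # Refresh on the first run in each new calendar month.
        return (today.year, today.month) != (last_refresh.year, last_refresh.month)
    raise ValueError(f"unknown cadence: {cadence}")
```

A scheduled job would call this once a day and trigger the S3 pull only when it returns True, so a failed run simply retries the next day.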
Page 1
The Account App page provides information on application usage, with filters that surface the Top 10 accounts, the Bottom 10, and a summary view.
Page 2
The Usage over Time page helps the people at iWorQ see usage over time across different applications. It clarifies usage trends and lets them see the spikes in usage they were interested in.
Page 3
The Changes in Usage page shows the records created and helps them understand how usage has changed over the two-year window.
Page 4
This last page covers the crucial task of identifying low-usage and inactive accounts. This feature emerged during our planning of the project scope.